Artistic Style Transfer with Internal-external Learning and Contrastive Learning
Although existing artistic style transfer methods have achieved significant improvement with deep neural networks, they still suffer from artifacts such as disharmonious colors and repetitive patterns. Motivated by this, we propose an internal-external style transfer method with two contrastive losses. Specifically, we utilize internal statistics of a single style image to determine the colors and texture patterns of the stylized image, and in the meantime, we leverage the external information of the large-scale style dataset to learn the human-aware style information, which makes the color distributions and texture patterns in the stylized image more reasonable and harmonious. In addition, we argue that existing style transfer methods only consider the content-to-stylization and style-to-stylization relations, neglecting the stylization-to-stylization relations. To address this issue, we introduce two contrastive losses, which pull the multiple stylization embeddings closer to each other when they share the same content or style, but push far away otherwise. We conduct extensive experiments, showing that our proposed method can not only produce visually more harmonious and satisfying artistic images, but also promote the stability and consistency of rendered video clips.
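The stylization-to-stylization contrast described in this abstract follows the familiar InfoNCE pattern: embeddings that share a content or style are pulled together, all others are pushed apart. A minimal sketch in plain Python (the function and temperature are illustrative stand-ins, not the paper's actual loss):

```python
import math

def cosine(u, v):
    # cosine similarity between two embedding vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(anchor, positive, negatives, tau=0.1):
    """InfoNCE-style loss: high when the anchor is closer to the
    negatives than to its positive, low when the positive wins."""
    sims = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    m = max(s / tau for s in sims)              # subtract max for stability
    exps = [math.exp(s / tau - m) for s in sims]
    return -math.log(exps[0] / sum(exps))
```

In the paper's setting the anchor and positive would be embeddings of two stylizations sharing the same style (or content), and the negatives would be stylizations that share neither.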
Coarse-to-Fine Structure-Aware Artistic Style Transfer
Liu, Kunxiao, Yuan, Guowu, Wu, Hao, Qian, Wenhua
Artistic style transfer aims to use a style image and a content image to synthesize a target image that retains the artistic expression of the style image while preserving the basic content of the content image. Many recently proposed style transfer methods share a common problem: they simply transfer the texture and color of the style image onto the global structure of the content image, so the result's local structure does not resemble the local structure of the style image. In this paper, we present an effective method that transfers style patterns while fusing the local style structure into the local content structure. In our method, different levels of coarse stylized features are first reconstructed at low resolution using a Coarse Network, in which the style color distribution is roughly transferred and the content structure is combined with the style structure. Then, the reconstructed features and the content features are used to synthesize high-quality, high-resolution, structure-aware stylized images using a Fine Network with three structural selective fusion (SSF) modules. The effectiveness of our method is demonstrated through the generation of appealing high-quality stylization results and a comparison with state-of-the-art style transfer methods.
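The coarse-to-fine pipeline can be caricatured on a 1-D signal: stylize a low-resolution view, upsample it, and fuse it with the full-resolution content. The `style_fn` callback and the linear blend below are stand-ins, not the paper's Coarse/Fine Networks or SSF modules:

```python
def downsample(x, factor=2):
    # average-pool: a coarse, low-resolution view of the signal
    return [sum(x[i:i + factor]) / factor for i in range(0, len(x), factor)]

def upsample(x, factor=2):
    # nearest-neighbor: back to the fine resolution
    return [v for v in x for _ in range(factor)]

def coarse_to_fine(content, style_fn, alpha=0.5):
    """Run the stylization on the coarse view, then fuse the upsampled
    coarse result with the full-resolution content (a crude stand-in
    for the structural selective fusion step)."""
    coarse = style_fn(downsample(content))
    up = upsample(coarse)
    return [alpha * u + (1 - alpha) * c for u, c in zip(up, content)]
```

With `style_fn` as the identity, `coarse_to_fine([0.0, 2.0, 4.0, 6.0], lambda x: x)` blends the pooled signal back into the original, illustrating how coarse structure and fine content are combined.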
- Europe > Switzerland > Zürich > Zürich (0.14)
- North America > United States > Utah > Salt Lake County > Salt Lake City (0.04)
- North America > United States > Hawaii > Honolulu County > Honolulu (0.04)
- (9 more...)
- Information Technology > Sensing and Signal Processing > Image Processing (1.00)
- Information Technology > Artificial Intelligence > Vision (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.46)
Rethink Arbitrary Style Transfer with Transformer and Contrastive Learning
Zhang, Zhanjie, Sun, Jiakai, Li, Guangyuan, Zhao, Lei, Zhang, Quanwei, Lan, Zehua, Yin, Haolin, Xing, Wei, Lin, Huaizhong, Zuo, Zhiwen
Arbitrary style transfer holds widespread attention in research and boasts numerous practical applications. The existing methods, which either employ cross-attention to incorporate deep style attributes into content attributes or use adaptive normalization to adjust content features, fail to generate high-quality stylized images. In this paper, we introduce an innovative technique to improve the quality of stylized images. First, we propose Style Consistency Instance Normalization (SCIN), a method to refine the alignment between content and style features. In addition, we develop an Instance-based Contrastive Learning (ICL) approach designed to capture the relationships among various styles, thereby enhancing the quality of the resulting stylized images. Recognizing that VGG networks are more adept at extracting classification features and less well suited to capturing style features, we also introduce a Perception Encoder (PE) to capture style features. Extensive experiments demonstrate that our proposed method generates high-quality stylized images and effectively prevents artifacts compared with existing state-of-the-art methods.
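The adaptive normalization cited in this abstract as a baseline is conventionally AdaIN: renormalize content features so each channel takes on the style features' mean and standard deviation. A single-channel sketch (the paper's SCIN refinement is not reproduced here, only the standard baseline it builds on):

```python
import statistics

def adain(content, style, eps=1e-5):
    """Adaptive instance normalization for one feature channel:
    whiten the content activations, then rescale and shift them
    to match the style channel's statistics."""
    mc, sc = statistics.fmean(content), statistics.pstdev(content)
    ms, ss = statistics.fmean(style), statistics.pstdev(style)
    return [(x - mc) / (sc + eps) * ss + ms for x in content]
```

After the transform, the content channel carries the style channel's first- and second-order statistics while keeping its own spatial pattern.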
- Asia > China > Zhejiang Province > Hangzhou (0.04)
- Europe > Switzerland > Vaud > Lausanne (0.04)
Curve-based Neural Style Transfer
Chen, Yu-hsuan, Kara, Levent Burak, Cagan, Jonathan
This research presents a new parametric style transfer framework specifically designed for curve-based design sketches. Traditional challenges faced by neural style transfer methods in handling binary sketch transformations are effectively addressed through parametric shape-editing rules, efficient curve-to-pixel conversion techniques, and the fine-tuning of VGG19 on ImageNet-Sketch, enhancing its role as a feature pyramid network for precise style extraction. By harmonizing intuitive curve-based imagery with rule-based editing, this study holds the potential to significantly enhance design articulation and elevate the practice of style transfer within the realm of product design.
Figure 1: Workflow of the proposed curve-based style transfer method.
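Style extraction with a VGG-type network, as referenced in this abstract, conventionally reduces to Gram matrices of feature maps (the Gatys formulation). A minimal version, with `features` assumed to be a channels-by-positions list of lists rather than a real VGG19 activation:

```python
def gram_matrix(features):
    """Gram matrix of a (channels x positions) feature map:
    pairwise inner products between channels, the classic
    second-order style statistic."""
    c = len(features)
    return [[sum(fi * fj for fi, fj in zip(features[i], features[j]))
             for j in range(c)] for i in range(c)]
```

A style loss then compares the Gram matrices of the stylized output and the style image, layer by layer; the matrix is symmetric by construction.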
- North America > United States > Pennsylvania > Allegheny County > Pittsburgh (0.05)
- North America > United States > Louisiana > Orleans Parish > New Orleans (0.05)
- Information Technology > Sensing and Signal Processing > Image Processing (0.74)
- Information Technology > Artificial Intelligence > Representation & Reasoning (0.71)
- Information Technology > Artificial Intelligence > Vision > Image Understanding (0.64)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.48)
Generating Modern Persian Carpet Map by Style-transfer
Rahmatian, Dorsa, Moshavash, Monireh, Eftekhari, Mahdi, Hoseinkhani, Kamran
Today, the strong performance of Deep Neural Networks (DNNs) has been proven in various fields. One of their most attractive applications is producing artistic designs. A carpet, known as a piece of art, is one of the most important items in a house and has many enthusiasts all over the world. The first stage of producing a carpet is preparing its map, which is a difficult, time-consuming, and expensive task. In this research work, our purpose is to use DNNs to generate modern Persian carpet maps. To reach this aim, three different DNN style transfer methods are proposed and compared against each other. In the proposed methods, the Style-Swap method is used to create the initial carpet map, and then, to generate more diverse designs, the Clip-Styler, Gatys, and Style-Swap methods are applied separately. In addition, some methods are examined and introduced for coloring the produced carpet maps. The designed maps are evaluated via filled questionnaires, where the outcomes of user evaluations confirm the popularity of the generated carpet maps. For the first time, intelligent methods are used to produce carpet maps, reducing human intervention. The proposed methods can successfully produce diverse carpet designs at a higher speed than traditional ways.
- North America > United States (0.14)
- Asia > Middle East > Iran > Kerman Province > Kerman (0.05)
- Europe > Poland (0.04)
- Research Report (0.50)
- Questionnaire & Opinion Survey (0.38)